
    First-order logic learning in artificial neural networks

    Artificial Neural Networks have previously been applied in neuro-symbolic learning to learn ground logic program rules. However, there are few results on learning relations with neuro-symbolic methods. This paper presents the system PAN, which can learn relations. The inputs to PAN are one or more atoms, representing the conditions of a logic rule, and the output is the conclusion of the rule. The symbolic inputs may include functional terms of arbitrary depth and arity, and the output may include terms constructed from the input functors. Symbolic inputs are encoded as integers using an invertible encoding function, which is applied in reverse to extract the output terms. The main advance of the system is a convention for constructing Artificial Neural Networks able to learn rules with the same expressive power as first-order definite clauses. The system is tested on three examples and the results are discussed.
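
    To make the encoding step concrete, the following is a minimal sketch of one way such an invertible term-to-integer encoding can work. The paper does not specify PAN's actual function, so the Cantor-pairing scheme, the functor table and the fixed arities below are assumptions for illustration only.

    import math

    # Hypothetical functor table and arities; PAN's real encoding is not given here.
    FUNCTORS = {"zero": 0, "s": 1, "nil": 2, "cons": 3}
    ARITY = {"zero": 0, "s": 1, "nil": 0, "cons": 2}
    INDEX = {v: k for k, v in FUNCTORS.items()}

    def pair(x: int, y: int) -> int:
        """Cantor pairing: an invertible map from pairs of naturals to a natural."""
        return (x + y) * (x + y + 1) // 2 + y

    def unpair(z: int) -> tuple[int, int]:
        """Inverse of the Cantor pairing function."""
        w = (math.isqrt(8 * z + 1) - 1) // 2
        y = z - w * (w + 1) // 2
        return w - y, y

    def encode(term: tuple) -> int:
        """Encode a term such as ("s", ("s", ("zero",))) as a single integer."""
        functor, *args = term
        acc = 0
        for arg in reversed(args):
            acc = pair(encode(arg), acc)
        return pair(FUNCTORS[functor], acc)

    def decode(code: int) -> tuple:
        """Recover the term from its integer code (the encoding run in reverse)."""
        fcode, acc = unpair(code)
        functor = INDEX[fcode]
        args = []
        for _ in range(ARITY[functor]):
            head, acc = unpair(acc)
            args.append(decode(head))
        return (functor, *args)

    term = ("s", ("s", ("zero",)))
    assert decode(encode(term)) == term   # round trip: the encoding is invertible

    Decoding uses each functor's known arity to peel off the argument codes, mirroring the idea of running the encoding in reverse to extract the output terms.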

    Universal Image Steganalytic Method

    In this paper we introduce a new universal steganalytic method for the JPEG file format that detects both well-known and newly developed steganographic methods. The steganalytic model is trained on the MHF-DZ steganographic algorithm previously designed by the same authors. A calibration technique with Feature Based Steganalysis (FBS) was employed to identify the statistical changes caused by embedding secret data into an original image. The steganalyzer uses Support Vector Machine (SVM) classification to train a model that is later used to distinguish between clean (cover) and steganographic images. The aim of the paper is to analyze the variation in detection accuracy (ACR) when detecting test steganographic algorithms such as F5, OutGuess, Model Based Steganography without deblocking, and JP Hide&Seek, which represent commonly used steganographic tools. A comparison of four feature vectors of different lengths, FBS(22), FBS(66), FBS(274) and FBS(285), shows promising results for the proposed universal steganalytic method compared to binary methods.
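
    As an illustration of the classification stage only (not the authors' feature extraction or calibration pipeline), the sketch below trains an SVM to separate cover from stego feature vectors. The random data, the FBS(274) vector length and the kernel settings are assumptions.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Placeholder data standing in for extracted FBS(274) feature vectors;
    # label 0 = clean (cover) image, label 1 = steganographic image.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 274))
    y = rng.integers(0, 2, size=400)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    clf = SVC(kernel="rbf", C=10.0, gamma="scale")   # assumed hyperparameters
    clf.fit(X_train, y_train)

    # Detection accuracy on held-out images (meaningless on random data, but
    # this is the quantity an ACR comparison is based on).
    print("ACR:", accuracy_score(y_test, clf.predict(X_test)))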

    The complexity and generality of learning answer set programs

    Traditionally, most of the work in the field of Inductive Logic Programming (ILP) has addressed the problem of learning Prolog programs. On the other hand, Answer Set Programming is increasingly being used as a powerful language for knowledge representation and reasoning, and is also gaining increasing attention in industry. Consequently, research activity in ILP has widened to the area of Answer Set Programming, witnessing the proposal of several new learning frameworks that extend ILP to learning answer set programs. In this paper, we investigate the theoretical properties of these existing frameworks for learning programs under the answer set semantics. Specifically, we present a detailed analysis of the computational complexity of each framework with respect to two decision problems: deciding whether a hypothesis is a solution of a learning task, and deciding whether a learning task has any solutions. We introduce a new notion of the generality of a learning framework, which enables one framework to be defined as more general than another in terms of its ability to distinguish a correct ASP hypothesis from a set of incorrect ASP programs. Based on this notion, we formally prove a generality relation over the set of existing frameworks for learning programs under the answer set semantics. In particular, we show that our recently proposed framework, Context-dependent Learning from Ordered Answer Sets, is more general than brave induction, induction of stable models, and cautious induction, and maintains the same complexity as cautious induction, which has the highest complexity of these frameworks.
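
    To make the first decision problem concrete, the sketch below checks whether a hypothesis is a brave inductive solution, i.e. whether the background knowledge combined with the hypothesis has at least one answer set covering the examples. It uses the clingo Python API and a deliberately simplified notion of example (ground atoms only); it is not the encoding used by any of the frameworks analysed in the paper.

    import clingo

    def is_brave_solution(background: str, hypothesis: str,
                          positive: list[str], negative: list[str]) -> bool:
        """Verification for brave induction: does B together with H have an
        answer set containing every positive example and no negative one?"""
        program = background + hypothesis
        # Constraints prune answer sets that violate the examples, so the
        # program is satisfiable exactly when a covering answer set exists.
        program += "".join(f":- not {e}.\n" for e in positive)
        program += "".join(f":- {e}.\n" for e in negative)
        ctl = clingo.Control()
        ctl.add("base", [], program)
        ctl.ground([("base", [])])
        return ctl.solve().satisfiable

    # Toy task: the hypothesis 1 { p ; q } 1 bravely covers p while excluding q.
    print(is_brave_solution("", "1 { p ; q } 1.\n", ["p"], ["q"]))   # True

    A cautious variant would instead require every answer set of the combined program to cover the examples; as the abstract notes, cautious induction has the highest complexity of the frameworks considered.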

    Labelled Natural Deduction for Substructural Logics

    In this paper a uniform methodology to perform Natural Deduction over the family of linear, relevance and intuitionistic logics is proposed. The methodology follows the Labelled Deductive Systems (LDS) discipline, where the deductive process manipulates declarative units - formulas labelled according to a labelling algebra. In the system described here, labels are either ground terms or variables of a given labelling language and inference rules manipulate formulas and labels simultaneously, generating (whenever necessary) constraints on the labels used in the rules. A set of natural deduction style inference rules is given, and the notion of a derivation is defined which associates a labelled natural deduction style "structural derivation" with a set of generated constraints. Algorithmic procedures, based on a technique called resource abduction, are defined to solve the constraints generated within a derivation, and their termination conditions discussed. A natural deduction derivation is correct with respect to a given substructural logic, if, under the condition that the algorithmic procedures terminate, the associated set of constraints is satisfied with respect to the underlying labelling algebra. This is shown by proving that the natural deduction system is sound and complete with respect to the LKE tableaux system [DG94].

    Search space expansion for efficient incremental inductive logic programming from streamed data

    In the past decade, several systems for learning Answer Set Programs (ASP) have been proposed, including the recent FastLAS system. Compared to other state-of-the-art approaches to learning ASP, FastLAS is more scalable: rather than computing the hypothesis space in full, it computes a much smaller subset relative to a given set of examples that is nonetheless guaranteed to contain an optimal solution to the task (called an OPT-sufficient subset). On the other hand, like many other Inductive Logic Programming (ILP) systems, FastLAS is designed to be run on a fixed learning task, meaning that if new examples are discovered after learning, the whole process must be run again. In many real applications, data arrives in a stream, and rerunning an ILP system from scratch each time new examples arrive is inefficient. In this paper we address this problem by presenting IncrementalLAS, a system that uses a new technique, called hypothesis space expansion, to enable a FastLAS-like OPT-sufficient subset to be expanded each time new examples are discovered. We prove that this preserves FastLAS's guarantee of finding an optimal solution to the full task (including the new examples), while removing the need to repeat previous computations. Through our evaluation, we demonstrate that running IncrementalLAS on tasks updated with sequences of new examples is significantly faster than re-running FastLAS from scratch on each updated task.
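
    The streamed-learning loop can be pictured schematically as below. This is a toy sketch only: compute_opt_sufficient_subset, expand_subset and solve are hypothetical placeholders standing in for the behaviour described above, not the actual FastLAS or IncrementalLAS interfaces, and their bodies here are trivial stand-ins so the sketch runs.

    # Trivial placeholder stand-ins so the sketch executes; NOT the real algorithms.
    def compute_opt_sufficient_subset(background, examples):
        return {f"rule_for({e})" for e in examples}

    def expand_subset(subset, background, new_examples):
        return subset | {f"rule_for({e})" for e in new_examples}

    def solve(background, subset, examples):
        return sorted(r for r in subset if any(e in r for e in examples))

    def learn_from_stream(background, example_stream):
        """Re-learn after each new batch, reusing and expanding the previously
        computed OPT-sufficient subset instead of recomputing it from scratch."""
        examples, subset = [], None
        for batch in example_stream:
            examples.extend(batch)
            if subset is None:
                # First batch: build the subset relative to the initial examples.
                subset = compute_opt_sufficient_subset(background, examples)
            else:
                # Later batches: hypothesis space expansion grows the subset just
                # enough to stay OPT-sufficient for the enlarged task.
                subset = expand_subset(subset, background, batch)
            # Search only the (small) subset for a hypothesis that is optimal
            # with respect to all examples seen so far.
            yield solve(background, subset, examples)

    for hypothesis in learn_from_stream("", [["p(1)"], ["p(2)", "q(3)"]]):
        print(hypothesis)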

    Labelled natural deduction for substructural logics

    In this paper a uniform methodology to perform Natural Deduction over the family of linear, relevance and intuitionistic logics is proposed. The methodology follows the Labelled Deductive Systems (LDS) discipline, where the deductive process manipulates declarative units - formulas labelled according to a labelling algebra. In the system described here, labels are either ground terms or variables of a given labelling language and inference rules manipulate formulas and labels simultaneously, generating (whenever necessary) constraints on the labels used in the rules. A set of natural deduction style inference rules is given, and the notion of a derivation is defined which associates a labelled natural deduction style "structural derivation" with a set of generated constraints. Algorithmic procedures, based on a technique called resource abduction, are defined to solve the constraints generated within a derivation, and their termination conditions discussed. A natural deduction derivation is correct with respect to a given substructural logic, if, under the condition that the algorithmic procedures terminate, the associated set of constraints is satisfied with respect to the underlying labelling algebra. This is shown by proving that the natural deduction system is sound and complete with respect to the LKE tableaux system.
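
    The fragment below is a toy illustration of the labelled style (an assumed syntax, not the system defined in the paper): implication elimination builds the conclusion's label from the labels of its premises, and implication introduction records the side condition on label occurrences that distinguishes linear, relevance and intuitionistic logic.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Labelled:
        label: str       # a ground label term or a label variable
        formula: str     # e.g. "A" or "A -> B"

    constraints: list[str] = []

    def implies_elim(major: Labelled, minor: Labelled) -> Labelled:
        """From a : A -> B and b : A derive (a o b) : B, combining the labels."""
        antecedent, consequent = [s.strip() for s in major.formula.split("->", 1)]
        assert antecedent == minor.formula, "minor premise must match the antecedent"
        return Labelled(f"({major.label} o {minor.label})", consequent)

    def implies_intro(assumption: Labelled, conclusion: Labelled, logic: str) -> Labelled:
        """Discharge x : A used in deriving t : B, yielding t : A -> B plus a
        constraint on how often x may occur in t; the condition selects the logic."""
        occurrences = {"linear": "exactly once",
                       "relevance": "at least once",
                       "intuitionistic": "any number of times"}[logic]
        constraints.append(f"{assumption.label} occurs {occurrences} in {conclusion.label}")
        return Labelled(conclusion.label, f"{assumption.formula} -> {conclusion.formula}")

    # usage: from f : A -> B and an assumption x : A, derive A -> B under linear logic
    step = implies_elim(Labelled("f", "A -> B"), Labelled("x", "A"))
    print(implies_intro(Labelled("x", "A"), step, logic="linear"))
    print(constraints)   # ['x occurs exactly once in (f o x)']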

    The use of the semantic and stylistic device of wordplay as a means of enhancing the expressiveness of a literary text

    Using material from the Ukrainian language, the article examines some aspects of the punning interpretation and play on various types of semantic and structural relations at the phonetic and lexical levels of language, and their use as a means of enhancing the expressiveness of literary texts.

    Machine comprehension of text using combinatory categorial grammar and answer set programs

    We present an automated method for generating Answer Set Programs from narratives written in English and demonstrate how such a representation can be used to answer questions about the text. The proposed approach relies on a transparent interface between the syntax and semantics of natural language provided by Combinatory Categorial Grammars to translate text into Answer Set Programs, hence creating a knowledge base that, together with background knowledge, can be queried.
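
    The end product of such a translation is an ASP knowledge base that an answer set solver can query. The fragment below is an illustrative, hand-written stand-in for that output (the actual CCG-based translation and the background theory used in the paper are not reproduced here); it encodes a two-sentence narrative and a "where" question, and can be run with clingo's Python API.

    import clingo

    program = """
    % facts standing in for "John went to the kitchen. John took the apple."
    move(john, kitchen, 1).
    take(john, apple, 2).

    % simple background knowledge: locations persist over (bounded) time, and a
    % taken object is wherever its taker is.
    at(A, L, T)   :- move(A, L, T).
    moved(A, T)   :- move(A, _, T).
    at(A, L, T+1) :- at(A, L, T), not moved(A, T+1), T < 3.
    at(O, L, T)   :- take(P, O, T2), T2 <= T, at(P, L, T).

    % the question "Where is the apple?" asked at time 3
    answer(L) :- at(apple, L, 3).
    #show answer/1.
    """

    ctl = clingo.Control()
    ctl.add("base", [], program)
    ctl.ground([("base", [])])
    ctl.solve(on_model=print)   # prints: answer(kitchen)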